
    27 pawns ready for action: A multi-indicator methodology and evaluation of thesaurus management tools from a LOD perspective

    Purpose – The purpose of this paper is to propose a methodology for assessing thesauri and other controlled-vocabulary management tools that can represent content using the Simple Knowledge Organization System (SKOS) data model, and their use in a Linked Open Data (LOD) paradigm. It then analyses a selected set of tools to demonstrate the validity of the method. Design/methodology/approach – A set of 27 criteria grouped into five evaluation indicators is proposed and applied to ten vocabulary management applications that are compliant with the SKOS data model. Previous studies of controlled-vocabulary management software are gathered and analyzed to compare the evaluation parameters used and the results obtained for each tool. Findings – The results indicate that the tool obtaining the highest score on every indicator is PoolParty. The second- and third-ranked tools are, respectively, TemaTres and Intelligent Theme Manager, which score lower on most of the evaluation items. The use of a broad set of criteria to evaluate vocabulary management tools gives satisfactory results, and the set of five indicators and 27 criteria proposed here represents a useful evaluation system for selecting current and future vocabulary management tools. Research limitations/implications – The paper assesses only the ten most important and well-known software tools applied to thesaurus and vocabulary management up to October 2016. However, the evaluation criteria could be applied to new software that may appear in the future to create or manage SKOS vocabularies in compliance with LOD standards. Originality/value – The originality of this paper lies in the proposed indicators and criteria for evaluating vocabulary management tools, which can also be valuable for future software. The indicators are applied to the most exhaustive and qualified list of this kind of tool.
    The paper will help designers, information architects, metadata librarians, and other staff involved in the design of digital information systems to choose the right tool to manage their vocabularies in a LOD/vocabulary scenario.
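The multi-indicator scoring idea described above can be sketched as follows. This is a hypothetical illustration only: the criterion and indicator names below are invented for the example and are not the paper's actual 27 criteria.

```python
# Hypothetical sketch of multi-indicator scoring: criteria are grouped
# into indicators, each tool receives a per-criterion score, and an
# indicator's score is the sum of its criteria. Names are illustrative.

INDICATORS = {
    "skos_support": ["skos_import", "skos_export", "skos_xl"],
    "lod_features": ["content_negotiation", "sparql_endpoint"],
}

def indicator_scores(tool_scores):
    """tool_scores maps criterion name -> score (e.g. 0, 0.5, 1)."""
    return {
        indicator: sum(tool_scores.get(c, 0) for c in criteria)
        for indicator, criteria in INDICATORS.items()
    }

def total_score(tool_scores):
    """Overall score: sum of all indicator scores."""
    return sum(indicator_scores(tool_scores).values())

# A tool scoring full marks on all five illustrative criteria:
poolparty = {"skos_import": 1, "skos_export": 1, "skos_xl": 1,
             "content_negotiation": 1, "sparql_endpoint": 1}
print(total_score(poolparty))  # 5
```

Ranking tools then reduces to sorting them by `total_score`, with per-indicator scores available for the finer-grained comparison the paper reports.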

    Development of Detailed Clinical Models for Nursing Assessments and Nursing Interventions

    Objectives: The aim of this study was to develop and validate Detailed Clinical Models (DCMs) for nursing assessments and interventions. Methods: First, we identified the nursing assessment and nursing intervention entities. Second, we identified the attributes and attribute values needed to describe the entities in more detail; the data type and optionality of each attribute were then defined. Third, the entities, attributes and value sets in the DCMs were mapped to International Classification for Nursing Practice (ICNP) Version 2 concepts. Finally, the DCMs were validated by domain experts and applied to case reports. Results: In total, 481 DCMs (429 for nursing assessments and 52 for nursing interventions) were developed and validated. The DCMs developed in this study were found to be sufficiently comprehensive in representing the clinical concepts of nursing assessments and interventions. Conclusions: The DCMs developed in this study can be used in electronic nursing records to ensure the semantic interoperability of the nursing information documented there.
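The DCM structure described above (entities described by typed, optionally-required attributes with value sets, mapped to ICNP codes) can be sketched as a small data model. The field names and the example content are assumptions for illustration, not the study's actual schema.

```python
# Minimal sketch of a Detailed Clinical Model: an entity with typed,
# optionally-required attributes, each carrying a permitted value set
# and an optional ICNP Version 2 mapping. All names are illustrative.
from dataclasses import dataclass, field

@dataclass
class Attribute:
    name: str
    data_type: str              # e.g. "CD" (coded), "PQ" (quantity)
    optional: bool = True       # optionality, as defined in the study
    value_set: tuple = ()       # permitted coded values, if any
    icnp_code: str = ""         # mapping to ICNP Version 2, if known

@dataclass
class DetailedClinicalModel:
    entity: str
    icnp_code: str = ""
    attributes: list = field(default_factory=list)

# Hypothetical nursing-assessment DCM:
pain = DetailedClinicalModel(
    entity="Pain assessment",
    attributes=[
        Attribute("intensity", "PQ", optional=False),
        Attribute("character", "CD",
                  value_set=("dull", "sharp", "burning")),
    ],
)
print(len(pain.attributes))  # 2
```

Storing records against such a model is what enables the semantic interoperability the authors describe: two systems exchanging a "Pain assessment" entry agree on its attributes, data types and permitted values.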

    An integrated view of data quality in Earth observation

    Data quality is a difficult notion to define precisely, and different communities have different views and understandings of the subject. This causes confusion, a lack of harmonization of data across communities, and omission of vital quality information. For some existing data infrastructures, data quality standards cannot address the problem adequately, nor can they fulfil all user needs or cover all concepts of data quality. In this study, we discuss some philosophical issues on data quality. We identify actual user needs on data quality, review existing standards and specifications on data quality, and propose an integrated model for data quality in the field of Earth observation (EO). We also propose a practical mechanism for applying the integrated quality information model to a large number of datasets through metadata inheritance. While our data quality management approach is in the domain of EO, we believe that the ideas and methodologies for data quality management can be applied to wider domains and disciplines to facilitate quality-enabled scientific research.
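The metadata-inheritance mechanism mentioned above can be sketched simply: quality information recorded once at collection level is inherited by every member dataset unless the dataset overrides it, avoiding per-dataset duplication. The keys and values below are illustrative assumptions, not the paper's actual model.

```python
# Sketch of metadata inheritance for quality information: a dataset's
# effective quality record is the collection-level record with any
# dataset-level entries overriding it. Keys/values are illustrative.

def effective_quality(dataset_meta, collection_meta):
    """Merge quality metadata; dataset-level entries take precedence."""
    merged = dict(collection_meta)   # start from inherited values
    merged.update(dataset_meta)      # dataset-specific overrides
    return merged

# Quality information stated once for a whole collection:
collection = {"lineage": "MODIS L1B", "validation_stage": 2}
# A single granule only records what differs:
granule = {"cloud_cover": 0.12}

print(effective_quality(granule, collection))
```

With thousands of granules per collection, only the collection record needs maintaining; each dataset carries just its deviations.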

    Machining centre performance monitoring with calibrated artefact probing

    Maintaining high levels of geometric accuracy in five-axis machining centres is of critical importance to many industries and applications. Numerous methods for error identification have been developed in both the academic and industrial fields; one commonly applied technique is artefact probing, which can reveal inherent system errors at minimal cost and does not require high skill levels to perform. The primary focus of popular commercial solutions is on confirming machine capability to produce accurate workpieces, with the potential for short-term trend analysis and fault diagnosis through interpretation of the results by an experienced user. This paper considers expanding the artefact probing method into a performance monitoring system, giving both the onsite maintenance engineer and the visiting specialist engineer more accessible information and more effective means of forming insight. A technique for constructing a data-driven tolerance threshold is introduced, describing the normal operating condition and helping protect against unwarranted settings induced by human error. A multifunctional graphical element is developed to present the data trends with the tolerance threshold integrated, maintaining relevant performance context, together with an automated event detector to highlight areas of interest or concern. The methods were developed on a simulated demonstration dataset and then applied, without modification, to three case studies on data acquired from operating industrial machining centres. The data-driven tolerance threshold and event detector were shown to be effective at their respective tasks, and the merits of the multifunctional graphical display are presented and discussed.
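One simple way to realise a data-driven tolerance threshold and event detector of the kind described above is a k-sigma band fitted to baseline probing results. The specific rule below is an assumption for illustration; the paper's actual construction may differ.

```python
# Sketch of a data-driven tolerance threshold (k-sigma band around the
# baseline "normal operating condition") and a simple event detector
# flagging probing results outside the band. Parameters are assumed.
import statistics

def tolerance_band(baseline, k=3.0):
    """Derive (low, high) limits from baseline probing results."""
    mu = statistics.mean(baseline)
    sigma = statistics.stdev(baseline)
    return mu - k * sigma, mu + k * sigma

def detect_events(series, band):
    """Return indices of probing results outside the tolerance band."""
    lo, hi = band
    return [i for i, x in enumerate(series) if not (lo <= x <= hi)]

# Baseline runs (e.g. a probed error metric in mm) define the band:
baseline = [0.010, 0.012, 0.011, 0.009, 0.010, 0.011]
band = tolerance_band(baseline)

# New runs are checked automatically; index 2 exceeds the band:
new_runs = [0.011, 0.010, 0.025, 0.012]
print(detect_events(new_runs, band))  # [2]
```

Deriving the band from the machine's own history, rather than from a manually entered limit, is what protects against the human-error-induced settings the abstract mentions.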

    Applicability of the ISO Reference Terminology Model for Nursing to the Detailed Clinical Models of Perinatal Care Nursing Assessments

    Objectives: The purpose of this study was to examine the applicability of the International Organization for Standardization (ISO) reference terminology model for nursing in describing the terminological value domain content of the entities and attributes of the detailed clinical models (DCMs) used for nursing assessments. Methods: The first author mapped 52 DCM entities and 45 DCM attributes used for perinatal care nursing assessments to the semantic domains and semantic domain qualifiers of the ISO model. The mapping results of the entity and attribute concepts were classified into four categories: mapped to a semantic domain qualifier, mapped to a semantic domain, mapped to a broader semantic domain concept, and not mapped. The DCM-level mapping results were classified into three categories: fully mapped, partially mapped, and not mapped. The second author verified the mapping. Results: All of the entities and 53.3% of the attribute concepts of the DCMs were mapped to semantic domains or semantic domain qualifiers of the ISO model, 37.8% of the attributes were mapped to a broader semantic domain concept, and 8.9% of the attributes were not mapped. At the model level, 48.1% of the DCMs were fully mapped to semantic domains or semantic domain qualifiers of the ISO model, and 51.9% of the DCMs were partially mapped. Conclusions: The findings of this study demonstrate that the ISO reference terminology model for nursing is applicable in representing the DCM structure for perinatal care nursing assessment. However, more qualifiers of the Judgment semantic domain are required.
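The two-level classification used above can be sketched as follows: each concept's mapping result falls into one of four categories, and a DCM is "fully mapped" only if all of its concepts reached a semantic domain or qualifier. The category names follow the abstract; the aggregation logic is an illustrative assumption.

```python
# Sketch of the two-level mapping classification: concept-level results
# (four categories) are aggregated into a DCM-level result (three
# categories). The aggregation rule is assumed for illustration.

CONCEPT_CATEGORIES = ("qualifier", "semantic_domain", "broader", "not_mapped")

def dcm_category(concept_results):
    """Classify a DCM from the mapping results of its concepts."""
    mapped = [r in ("qualifier", "semantic_domain") for r in concept_results]
    if all(mapped):
        return "fully mapped"
    if any(mapped):
        return "partially mapped"
    return "not mapped"

print(dcm_category(["qualifier", "semantic_domain"]))  # fully mapped
print(dcm_category(["semantic_domain", "broader"]))    # partially mapped
```

Applying `dcm_category` across all 27 perinatal-care DCMs would reproduce the fully/partially mapped proportions the study reports.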

    Quantitative estimation of sampling uncertainties for mycotoxins in cereal shipments

    Many countries receive shipments of bulk cereals from primary producers. There is an ongoing body of work that seeks to arrive at appropriate standards for the quality of the shipments and the means to assess them as they are out-loaded. Of concern are mycotoxin and heavy metal levels, pesticide and herbicide residue levels, and contamination by genetically modified organisms (GMOs). As the ability to quantify these contaminants improves through improved analytical techniques, the sampling methodologies applied to the shipments must also keep pace to ensure that the uncertainties attached to the sampling procedures do not overwhelm the analytical uncertainties. There is a need to understand and quantify sampling uncertainties under varying conditions of contamination. The analysis required is statistical and is challenging, as the nature of the distribution of contaminants within a shipment is not well understood; very limited data exist. Limited work has been undertaken to quantify the variability of the contaminant concentrations in the flow of grain coming from a ship and the impact that this has on the variance of sampling. Relatively recent work by Paoletti et al. in 2006 [Paoletti C, Heissenberger A, Mazzara M, Larcher S, Grazioli E, Corbisier P, Hess N, Berben G, Lubeck PS, De Loose M, et al. 2006. Kernel lot distribution assessment (KeLDA): a study on the distribution of GMO in large soybean shipments. Eur Food Res Tech. 224:129–139] provides some insight into the variation in GMO concentrations in soybeans on cargo out-turn. Paoletti et al. analysed the data using correlogram analysis with the objective of quantifying the sampling uncertainty (variance) that attaches to the final cargo analysis, but this is only one possible means of quantifying sampling uncertainty.
It is possible that in many cases the levels of contamination passing the sampler on out-loading are essentially random, negating the value of variographic quantitation of the sampling variance. GMOs and mycotoxins appear to have a highly heterogeneous distribution in a cargo depending on how the ship was loaded (the grain may have come from more than one terminal and set of storage silos), and mycotoxin growth may have occurred in transit. This paper examines a statistical model based on random contamination that can be used to calculate the sampling uncertainty arising from primary sampling of a cargo; it deals with what is thought to be a worst-case scenario. The determination of the sampling variance is treated both analytically and by Monte Carlo simulation. The latter approach provides the entire sampling distribution and not just the sampling variance. The sampling procedure is based on rules provided by the Canadian Grain Commission (CGC), and the levels of contamination considered are those relating to allowable levels of ochratoxin A (OTA) in wheat. The results of the calculations indicate that at a loading rate of 1000 tonnes h⁻¹, primary sample increment masses of 10.6 kg, a 2000-tonne lot and a primary composite sample mass of 1900 kg, the relative standard deviation (RSD) is about 1.05 (105%) and the distribution of the mycotoxin (MT) level in the primary composite samples is highly skewed. This result applies to a mean MT level of 2 ng g⁻¹. The rate of false-negative results under these conditions is estimated to be 16.2%. The corresponding contamination is based on initial average concentrations of MT of 4000 ng g⁻¹ within average spherical volumes of 0.3 m diameter, which are then diluted by a factor of 2 each time they pass through a handling stage; four stages of handling are assumed. The Monte Carlo calculations allow for variation in the initial volume of the MT-bearing grain, the average concentration and the dilution factor. 
The Monte Carlo studies seek to show the effect of varying the sampling frequency while maintaining a primary composite sample mass of 1900 kg. The overall results are presented in terms of operational characteristic curves that relate only to the sampling uncertainties in the primary sampling of the grain. It is concluded that cross-stream sampling is intrinsically unsuited to sampling for mycotoxins and that better sampling methods and equipment are needed to control sampling uncertainties. At the same time, it is shown that some combination of cross-cutting sampling conditions may, for a given shipment mass and MT content, yield acceptable sampling performance.
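The random-contamination Monte Carlo approach described above can be sketched in highly simplified form: high-concentration pockets are diluted by a factor of 2 at each of four handling stages, and a composite sample is built from random increments. The pocket concentration, dilution factor and stage count come from the abstract; the increment count, pocket fraction and all simplifications (binary hit/miss increments, no pocket-size variation) are illustrative assumptions, so this does not reproduce the paper's actual model or its 1.05 RSD figure exactly.

```python
# Simplified Monte Carlo sketch of sampling uncertainty under random
# contamination. Pockets start at 4000 ng/g and are halved at each of
# four handling stages (abstract values); other parameters are assumed.
import random

def simulate_composite(n_increments=180, pocket_fraction=0.008,
                       pocket_conc=4000.0, dilution=2.0, stages=4,
                       rng=random):
    """Return the mean MT level (ng/g) of one composite sample."""
    conc = pocket_conc / dilution ** stages   # 4000 / 2^4 = 250 ng/g
    # Each increment either hits a contaminated pocket or clean grain;
    # with these assumed values the expected mean is 250 * 0.008 = 2 ng/g.
    increments = [conc if rng.random() < pocket_fraction else 0.0
                  for _ in range(n_increments)]
    return sum(increments) / n_increments

def sampling_rsd(n_trials=20000, seed=1, **kw):
    """Estimate the relative standard deviation of the composite mean."""
    rng = random.Random(seed)
    results = [simulate_composite(rng=rng, **kw) for _ in range(n_trials)]
    mean = sum(results) / n_trials
    var = sum((x - mean) ** 2 for x in results) / n_trials
    return (var ** 0.5) / mean if mean else float("inf")

print(round(sampling_rsd(), 2))  # RSD well above 0.5 under these assumptions
```

Even this toy model shows the qualitative conclusion: with rare, concentrated pockets, the composite-sample distribution is dominated by how many pockets happen to be hit, so the RSD is large and the distribution highly skewed.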

    An ontology for integrated machining and inspection process planning focusing on resource capabilities

    The search for and assignment of resources is extremely important for the efficient planning of any process in a distributed environment, such as the collaborative product integrated development process. These environments require a degree of semantic interoperability, which currently can only be provided by ontological models. However, the existing ontological proposals centred on resources for machining and inspection process planning have a limited reach, do not adopt a unified view of machining and inspection, and fail to express knowledge in the manner required by some of the planning tasks, such as those concerned with resource assignment and plan validation. With the aim of providing a solution to these shortcomings, the manufacturing and inspection resource capability (MIRC) ontology has been developed as a specialist offshoot of the product and processes development resources capability ontology. This ontology considers resource capabilities to be a characteristic of the resource executing any activity present in an integrated process plan. Special attention is given to resource preparation activities, due to their influence on the quality of the final product. After describing the MIRC ontology, a case study demonstrates how the ontology supports process planning at any level, approach or plan strategy. This work has been possible thanks to the funding received from the Spanish Ministry of Science and Education through the COAPP Research Project [reference DPI2007-66871-C02-01/02].
    Solano García, L.; Romero Subirón, F.; Rosado Castellano, P. (2016). An ontology for integrated machining and inspection process planning focusing on resource capabilities. International Journal of Computer Integrated Manufacturing. 29(1):1-15. doi:10.1080/0951192X.2014.1003149